Nonparametric Divergence Estimation

Abstract

A. The von Mises Expansion

Before diving into the auxiliary results of Section 5, let us first derive some properties of the von Mises expansion. It is a simple calculation to verify that the Gateaux derivative is simply the functional derivative of \phi in the event that T(F) = \int \phi(f) \, d\mu.

Lemma 8. Let T(F) = \int \phi(f) \, d\mu, where f = dF/d\mu is the Radon-Nikodym derivative and \phi is differentiable, and let G be some other distribution with density g = dG/d\mu. Then:

dT(G; F - G) = \int \frac{\partial \phi(g(x))}{\partial g(x)} \, (f(x) - g(x)) \, d\mu(x).   (11)
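Equation (11) can be sanity-checked numerically: the closed form should match a finite-difference directional (Gateaux) derivative of T at G in the direction F - G. The sketch below does this on a grid; the choices phi(t) = t^2, the two Gaussian densities, and the grid are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Numerical check of Lemma 8: compare the closed form of dT(G; F - G)
# from equation (11) against a finite-difference directional derivative.
# All concrete choices (phi(t) = t^2, Gaussian f and g, the grid) are
# illustrative assumptions.
x = np.linspace(-8.0, 8.0, 4001)
dx = x[1] - x[0]

def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

f = gauss(x, 1.0, 1.0)    # density of F
g = gauss(x, 0.0, 1.5)    # density of G

def phi(t):
    return t ** 2          # so T(F) = int f^2 dmu, and phi'(t) = 2t

def T(h):
    return np.sum(phi(h)) * dx

eps = 1e-5
lhs = (T(g + eps * (f - g)) - T(g)) / eps   # numerical Gateaux derivative
rhs = np.sum(2.0 * g * (f - g)) * dx        # closed form from (11)
print(lhs, rhs)                             # the two agree up to O(eps)
```

Since T is quadratic in this example, the finite-difference error is exactly eps times \int (f - g)^2 d\mu, so the agreement is tight.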


Similar sources

Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence

Asymptotically unbiased nearest-neighbor estimators for KL divergence have recently been proposed and demonstrated in a number of applications. With small sample sizes, however, these nonparametric methods typically suffer from high estimation bias due to the non-local statistics of empirical nearest-neighbor information. In this paper, we show that this non-local bias can be mitigated by chang...
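For context, the classical asymptotically unbiased k-nearest-neighbor KL estimator that this abstract builds on can be written in a few lines of NumPy. This is a minimal 1-D sketch of the baseline estimator only; the paper's bias-reduction and metric-learning contributions are not shown.

```python
import numpy as np

# Sketch of the classical k-NN Kullback-Leibler divergence estimator
# (the baseline the abstract refers to), for 1-D samples. Distributions
# and sample sizes below are illustrative.
def knn_kl(x, y, k=1):
    """Estimate D(P||Q) from samples x ~ P and y ~ Q (1-D arrays)."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    n, m = len(x), len(y)
    # rho: distance from each x_i to its k-th nearest neighbor within x
    # (column 0 of the sorted distances is the zero self-distance)
    rho = np.sort(np.abs(x[:, None] - x[None, :]), axis=1)[:, k]
    # nu: distance from each x_i to its k-th nearest neighbor within y
    nu = np.sort(np.abs(x[:, None] - y[None, :]), axis=1)[:, k - 1]
    d = 1  # dimension
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))

rng = np.random.default_rng(0)
p = rng.normal(0.0, 1.0, 2000)   # P = N(0, 1)
q = rng.normal(1.0, 1.0, 2000)   # Q = N(1, 1); true KL(P||Q) = 0.5
print(knn_kl(p, q, k=1))         # roughly 0.5 for samples this size
```

The O(n^2) pairwise-distance matrices keep the sketch dependency-free; a k-d tree would be used in practice.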


Nonparametric Estimation of Renyi Divergence and Friends

We consider nonparametric estimation of L2, Rényi-α and Tsallis-α divergences between continuous distributions. Our approach is to construct estimators for particular integral functionals of two densities and translate them into divergence estimators. For the integral functionals, our estimators are based on corrections of a preliminary plug-in estimator. We show that these estimators achieve t...
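The "preliminary plug-in estimator" mentioned above can be illustrated for the simplest case, the L2 divergence \int (f - g)^2: estimate each density with a kernel density estimate and integrate on a grid. The bandwidth, grid, and distributions below are arbitrary choices, and the paper's bias corrections are omitted.

```python
import numpy as np

# Plug-in sketch for the L2 divergence int (f - g)^2: Gaussian-kernel
# density estimates evaluated on a grid, then numerically integrated.
# Bandwidth, grid, and distributions are illustrative assumptions.
def kde(samples, grid, h):
    z = (grid[:, None] - samples[None, :]) / h
    return np.exp(-0.5 * z ** 2).sum(axis=1) / (len(samples) * h * np.sqrt(2 * np.pi))

rng = np.random.default_rng(2)
xs = rng.normal(0.0, 1.0, 3000)   # samples from f = N(0, 1)
ys = rng.normal(1.0, 1.0, 3000)   # samples from g = N(1, 1)

grid = np.linspace(-6.0, 7.0, 1301)
f_hat = kde(xs, grid, h=0.25)
g_hat = kde(ys, grid, h=0.25)
l2 = np.sum((f_hat - g_hat) ** 2) * (grid[1] - grid[0])
print(l2)  # true value for these two Gaussians is about 0.125
```

The smoothing bias of the plain plug-in (visible as a slight underestimate here) is exactly what the correction terms in estimators of this kind are designed to remove.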


Estimating divergence functionals and the likelihood ratio by penalized convex risk minimization

We develop and analyze an algorithm for nonparametric estimation of divergence functionals and the density ratio of two probability distributions. Our method is based on a variational characterization of f -divergences, which turns the estimation into a penalized convex risk minimization problem. We present a derivation of our kernel-based estimation algorithm and an analysis of convergence rat...
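The variational idea, turning divergence estimation into a convex risk minimization over a function class, can be sketched with the simplest such scheme: logistic regression between the two samples, whose logit estimates the log density ratio when the sample sizes are equal. This illustrates the general density-ratio approach only; the paper's penalized kernel algorithm and f-divergence generality are not reproduced.

```python
import numpy as np

# Density-ratio sketch: fit a logistic classifier between samples of P
# and Q (a convex risk minimization); with equal sample sizes its logit
# estimates log p/q, which plugs into KL(P||Q) = E_P[log p/q].
# Distributions, features, and optimizer settings are illustrative.
rng = np.random.default_rng(1)
xp = rng.normal(0.0, 1.0, 4000)   # P = N(0, 1)
xq = rng.normal(1.0, 1.0, 4000)   # Q = N(1, 1); true KL(P||Q) = 0.5

X = np.concatenate([xp, xq])
y = np.concatenate([np.ones_like(xp), np.zeros_like(xq)])  # 1 = "from P"
Phi = np.stack([np.ones_like(X), X], axis=1)  # features [1, x] suffice here

w = np.zeros(2)
for _ in range(2000):              # plain gradient descent on the logistic loss
    p = 1.0 / (1.0 + np.exp(-Phi @ w))
    w -= 0.1 * Phi.T @ (p - y) / len(y)

log_ratio = Phi[:4000] @ w         # estimated log p/q at the P samples
print(log_ratio.mean())            # roughly KL(P||Q) = 0.5
```

For these two Gaussians the true log ratio is linear in x, so the feature map [1, x] is well specified; richer function classes (e.g. kernels, as in the paper) are needed in general.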


Shape Constrained Density Estimation via Penalized Rényi Divergence

Abstract. Shape constraints play an increasingly prominent role in nonparametric function estimation. While considerable recent attention has been focused on log concavity as a regularizing device in nonparametric density estimation, weaker forms of concavity constraints encompassing larger classes of densities have received less attention but offer some additional flexibility. Heavier tail beh...


Multivariate f-divergence Estimation With Confidence

The problem of f -divergence estimation is important in the fields of machine learning, information theory, and statistics. While several nonparametric divergence estimators exist, relatively few have known convergence properties. In particular, even for those estimators whose MSE convergence rates are known, the asymptotic distributions are unknown. We establish the asymptotic normality of a r...
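An asymptotic normality result of the kind this abstract establishes is what licenses confidence intervals for divergence estimates. The sketch below illustrates the normal-approximation step only, using a crude histogram plug-in KL estimator and independent replicates in place of the paper's estimator and its single-sample asymptotic variance.

```python
import numpy as np

# Normal-approximation confidence interval for a divergence estimate,
# using independent replicates of a crude histogram plug-in estimator.
# Everything here (estimator, bins, distributions) is illustrative.
rng = np.random.default_rng(3)

def hist_kl(x, y, edges):
    p, _ = np.histogram(x, bins=edges)
    q, _ = np.histogram(y, bins=edges)
    p = (p + 1) / (p.sum() + len(p))   # add-one smoothing avoids log(0)
    q = (q + 1) / (q.sum() + len(q))
    return float(np.sum(p * np.log(p / q)))

edges = np.linspace(-5.0, 6.0, 31)     # fixed bins covering both samples
ests = [hist_kl(rng.normal(0, 1, 2000), rng.normal(1, 1, 2000), edges)
        for _ in range(20)]            # replicates of KL(N(0,1) || N(1,1))

mean = np.mean(ests)
se = np.std(ests, ddof=1) / np.sqrt(len(ests))
print(f"KL ~ {mean:.3f} +/- {1.96 * se:.3f} (95% normal CI)")
```

Discretization and smoothing bias the histogram estimate slightly below the continuous value of 0.5; the point of the sketch is the mean +/- 1.96 se construction, which asymptotic normality justifies.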


Nonparametric Estimation of Spatial Risk for a Mean Nonstationary Random Field

Common methods for spatial risk estimation are investigated for a stationary random field. For simplicity, its distribution is assumed known, and a parametric variogram for the random field is considered. In this paper, we study a nonparametric method for spatial risk. In this method, we model the random field trend by a local linear estimator, and through bias-corrected residuals, ...



Publication date: 2014